The Hybrid BFGS-CG Method in Solving Unconstrained Optimization Problems
Authors
Abstract
Similar resources
A New Scaled Hybrid Modified BFGS Algorithms for Unconstrained Optimization
The BFGS method is a method for solving unconstrained optimization problems, and many modifications of it have been proposed. In this paper, new scaled hybrid modified BFGS algorithms are presented and analyzed. The scaled hybrid modified BFGS can reduce the number of iterations required. Results obtained by the hybrid modified BFGS algorithms are com...
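For reference, the standard (unscaled, unmodified) BFGS update of the Hessian approximation that such scaled hybrid variants build on is

\[
B_{k+1} = B_k - \frac{B_k s_k s_k^{\top} B_k}{s_k^{\top} B_k s_k} + \frac{y_k y_k^{\top}}{y_k^{\top} s_k},
\qquad s_k = x_{k+1} - x_k, \quad y_k = g_{k+1} - g_k.
\]

Scaled variants typically introduce a scaling parameter multiplying B_k before this update is applied; the particular scaling and hybridization proposed in the cited paper are not reproduced here.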
The Algorithms of Broyden-CG for Unconstrained Optimization Problems
The conjugate gradient method plays an important role in solving large-scale problems, and the quasi-Newton method is known as the most efficient method for solving unconstrained optimization problems. Therefore, in this paper, a new hybrid method between the conjugate gradient method and the quasi-Newton method for solving optimization problems is suggested...
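As a minimal sketch of how such a hybridization can look, assuming one generic combination rule (illustrative only, not necessarily the rule proposed in the cited paper): take the BFGS quasi-Newton direction and fall back to a Fletcher-Reeves conjugate gradient direction whenever the quasi-Newton direction fails a descent test. The names hybrid_bfgs_cg and backtracking, and all tolerances, are assumptions made for this sketch.

import numpy as np

def hybrid_bfgs_cg(f, grad, x0, tol=1e-6, max_iter=200):
    """Illustrative hybrid: BFGS direction with a CG fallback (not the cited paper's exact rule)."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                                # inverse-Hessian approximation
    g = grad(x)
    d_prev, g_prev = -g, g
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                               # quasi-Newton (BFGS) direction
        if g @ d > -1e-10 * np.linalg.norm(g) * np.linalg.norm(d):
            beta = (g @ g) / (g_prev @ g_prev) if k > 0 else 0.0
            d = -g + beta * d_prev               # Fletcher-Reeves CG fallback
            if g @ d >= 0.0:
                d = -g                           # last resort: steepest descent
        alpha = backtracking(f, g, x, d)
        s = alpha * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        if s @ y > 1e-12:                        # curvature check keeps H positive definite
            rho = 1.0 / (s @ y)
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)
        d_prev, g_prev, x, g = d, g, x_new, g_new
    return x

def backtracking(f, g, x, d, alpha=1.0, c=1e-4, r=0.5):
    """Simple Armijo backtracking line search."""
    fx, gd = f(x), g @ d
    while f(x + alpha * d) > fx + c * alpha * gd:
        alpha *= r
    return alpha

For example, hybrid_bfgs_cg(lambda x: x @ x, lambda x: 2 * x, np.array([3.0, -4.0])) should return a point at (or very near) the origin; the curvature test s @ y > 0 before the inverse update is the standard safeguard that keeps the approximation positive definite.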
On the Global Convergence of the BFGS Method for Nonconvex Unconstrained Optimization Problems
This paper is concerned with the open problem of whether the BFGS method with an inexact line search converges globally when applied to nonconvex unconstrained optimization problems. We propose a cautious BFGS update and prove that the method with either a Wolfe-type or an Armijo-type line search converges globally if the function to be minimized has Lipschitz continuous gradients.
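A cautious update of the kind described above can be written in roughly the following form (the constants are illustrative): perform the BFGS update of B_k only when the new curvature pair passes the test

\[
\frac{y_k^{\top} s_k}{\|s_k\|^2} \ge \epsilon \,\|g_k\|^{\alpha},
\qquad \epsilon > 0,\ \alpha > 0,
\]

and otherwise keep B_{k+1} = B_k. Skipping the update for pairs that fail the test prevents badly scaled curvature information, which can arise on nonconvex problems, from corrupting the Hessian approximation.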
A regularized limited-memory BFGS method for unconstrained minimization problems
The limited-memory BFGS (L-BFGS) algorithm is a popular method for solving large-scale unconstrained minimization problems. Since L-BFGS conducts a line search with the Wolfe condition, it may require many function evaluations for ill-posed problems. To overcome this difficulty, we propose a method that combines L-BFGS with the regularized Newton method. The computational cost for a single iterat...
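For reference, the standard L-BFGS two-loop recursion that turns the m most recently stored curvature pairs into a search direction is sketched below; the regularization proposed in the cited paper is not included, and the name lbfgs_direction is an illustrative choice.

import numpy as np

def lbfgs_direction(g, s_list, y_list):
    """Return the L-BFGS direction -H_k @ g from stored pairs (s_i, y_i), most recent last."""
    q = np.asarray(g, dtype=float).copy()
    stack = []
    for s, y in zip(reversed(s_list), reversed(y_list)):   # newest pair first
        rho = 1.0 / (y @ s)
        a = rho * (s @ q)
        q -= a * y
        stack.append((a, rho, s, y))
    if s_list:                                             # initial Hessian scaling
        gamma = (s_list[-1] @ y_list[-1]) / (y_list[-1] @ y_list[-1])
    else:
        gamma = 1.0
    r = gamma * q
    for a, rho, s, y in reversed(stack):                   # oldest pair first
        b = rho * (y @ r)
        r += (a - b) * s
    return -r

Because only the m stored pairs are touched, the work and memory per call are O(mn), which is what makes the limited-memory variant attractive for large-scale problems.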
A Free Line Search Steepest Descent Method for Solving Unconstrained Optimization Problems
In this paper, we solve unconstrained optimization problems using a free line search steepest descent method. First, we propose a double-parameter scaled quasi-Newton formula for calculating an approximation of the Hessian matrix. The approximation obtained from this formula is a positive definite matrix that satisfies the standard secant relation. We also show that the largest eigenvalue...
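The double-parameter scaled formula itself is not reproduced here; for context, the standard secant relation that the Hessian approximation is stated to satisfy is

\[
B_{k+1} s_k = y_k,
\qquad s_k = x_{k+1} - x_k, \quad y_k = \nabla f(x_{k+1}) - \nabla f(x_k),
\]

and positive definiteness of B_{k+1} under such an update requires the curvature condition s_k^{\top} y_k > 0.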
Journal
Journal title: Abstract and Applied Analysis
Year: 2014
ISSN: 1085-3375, 1687-0409
DOI: 10.1155/2014/507102